AIbase
# 1.7T pretraining

Tanuki 8x8B DPO v1.0
Apache-2.0
Tanuki-8x8B is a large language model pretrained from scratch and optimized for dialogue tasks through SFT and DPO.
Large Language Model · Transformers · Supports Multiple Languages
weblab-GENIAC
© 2025 AIbase